What is cache-manager?
The cache-manager npm package is a flexible caching library for Node.js applications, which supports a variety of storage solutions and provides a uniform API to interact with different caching mechanisms. It allows for easy integration and switching between different cache stores without changing the underlying application code.
What are cache-manager's main functionalities?
Caching and Retrieving Data
This feature allows you to cache data in memory and retrieve it using a key. The 'set' method stores the value, and the 'get' method retrieves it. The 'ttl' option specifies the time-to-live in seconds.
const cacheManager = require('cache-manager');
const memoryCache = cacheManager.caching({ store: 'memory', max: 100, ttl: 10 /* seconds */ });
// Now set a value
memoryCache.set('myKey', 'myValue', { ttl: 5 }, (err) => {
  if (err) { throw err; }
  // Get the value
  memoryCache.get('myKey', (error, result) => {
    console.log(result);
    // >> 'myValue'
  });
});
Cache Store Agnosticism
Cache-manager supports different stores such as memory, Redis, and more. This feature allows you to switch between different cache stores seamlessly. The example shows how to use Redis as the cache store.
const cacheManager = require('cache-manager');
const redisStore = require('cache-manager-redis-store');
const redisCache = cacheManager.caching({ store: redisStore, host: 'localhost', port: 6379, auth_pass: 'XXXX', db: 0, ttl: 600 });
// Listen for redis ready event
redisCache.store.events.on('redisReady', () => {
  console.log('Redis is ready');
});
// Listen for redis error event
redisCache.store.events.on('redisError', (error) => {
  console.error('Redis error', error);
});
Multi-Level Caching
Cache-manager allows for multi-level caching, where you can have a hierarchy of cache stores. Data is first checked in the fastest cache (e.g., memory), and if not found, it falls back to slower caches (e.g., Redis).
const cacheManager = require('cache-manager');
const memoryCache = cacheManager.caching({ store: 'memory', max: 100, ttl: 10 });
const redisCache = cacheManager.caching({ store: require('cache-manager-redis-store'), ttl: 600 });
const multiCache = cacheManager.multiCaching([memoryCache, redisCache]);
multiCache.set('foo', 'bar', { ttl: 5 }, (err) => {
  if (err) { throw err; }
  multiCache.get('foo', (error, result) => {
    console.log(result);
    // >> 'bar'
  });
});
Other packages similar to cache-manager
node-cache
node-cache is an in-memory caching package similar to cache-manager's memory store. It offers a simple and fast caching solution but does not support multiple backends or a tiered caching system.
lru-cache
lru-cache is an in-memory cache that implements the LRU (Least Recently Used) eviction policy. Unlike cache-manager, it is specifically tailored for LRU caching and does not support multiple storage backends.
keyv
keyv is a simple key-value storage with support for multiple backends, including Redis, MongoDB, SQLite, and more. It provides a unified interface across different stores but does not have built-in support for multi-level caching.
cache-manager
Flexible Node.js cache module
A cache module for Node.js that allows easy wrapping of functions in cache, tiered caches, and a consistent interface. This module is now part of the Cacheable project.
Features
- Made with TypeScript and compatible with ESModules
- Easy way to wrap any function in cache.
- Tiered caches: data gets stored in each cache and fetched from the highest-priority cache(s) first.
- Use any cache you want, as long as it has the same API.
- 100% test coverage via vitest.
Installation
pnpm install cache-manager
Usage Examples
Single Store
import { caching } from 'cache-manager';
const memoryCache = await caching('memory', {
max: 100,
ttl: 10 * 1000,
});
const ttl = 5 * 1000;
await memoryCache.set('foo', 'bar', ttl);
console.log(await memoryCache.get('foo'));
await memoryCache.del('foo');
console.log(await memoryCache.get('foo'));
const getUser = (id: string) => Promise.resolve({ id, name: 'Bob' });
const userId = 123;
const key = 'user_' + userId;
console.log(await memoryCache.wrap(key, () => getUser(userId), ttl));
See unit tests in test/caching.test.ts for more information.
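Conceptually, wrap follows a get-or-compute pattern: look the key up, and only invoke the worker function on a miss. The following is a minimal sketch of that idea against a plain Map, not cache-manager's actual implementation:

```typescript
// Minimal sketch of the get-or-compute pattern behind wrap().
// NOT cache-manager's implementation; it only illustrates the idea.
const sketchStore = new Map<string, { value: unknown; expiresAt: number }>();

async function wrapSketch<T>(
  key: string,
  fn: () => Promise<T>,
  ttlMs: number,
): Promise<T> {
  const hit = sketchStore.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value as T; // cache hit: skip calling fn
  }
  const value = await fn(); // cache miss: compute...
  sketchStore.set(key, { value, expiresAt: Date.now() + ttlMs }); // ...and store
  return value;
}
```

Repeated calls with the same key within the TTL return the cached value without invoking the worker function again, which is why wrap is a good fit for expensive lookups such as getUser above.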
Example setting/getting several keys with mset() and mget()
await memoryCache.store.mset(
[
['foo', 'bar'],
['foo2', 'bar2'],
],
ttl,
);
console.log(await memoryCache.store.mget('foo', 'foo2'));
await memoryCache.store.mdel('foo', 'foo2');
Custom Stores
You can use your own custom store by creating one with the same API as the built-in memory stores.
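As a rough illustration, a custom store is just an object exposing the same async methods as the built-in memory store. The method set below is an assumption based on cache-manager v5's Store interface (get, set, del, reset, keys); verify it against the typings of your installed version, and note that TTL handling is omitted for brevity:

```typescript
// Sketch of a custom store backed by a plain Map. Method names follow
// cache-manager v5's Store interface (an assumption; check your version's
// typings). TTL handling and mget/mset/mdel are omitted for brevity.
function mapStore() {
  const data = new Map<string, unknown>();
  return {
    async get<T>(key: string): Promise<T | undefined> {
      return data.get(key) as T | undefined;
    },
    async set<T>(key: string, value: T, _ttlMs?: number): Promise<void> {
      data.set(key, value); // a real store would schedule expiry here
    },
    async del(key: string): Promise<void> {
      data.delete(key);
    },
    async reset(): Promise<void> {
      data.clear();
    },
    async keys(): Promise<string[]> {
      return [...data.keys()];
    },
  };
}
```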
Create single cache store synchronously
As caching() requires async functionality to resolve some stores, it is not well-suited for default function/constructor parameters. If you need to create a cache store synchronously, use createCache() instead:
import { createCache, memoryStore } from 'cache-manager';
const memoryCache = createCache(memoryStore({
  max: 100,
  ttl: 10 * 1000,
}));
function myService(cache = createCache(memoryStore())) {}
const DEFAULT_CACHE = createCache(memoryStore(), { ttl: 60 * 1000 });
class MyService {
constructor(private cache = DEFAULT_CACHE) {}
}
Multi-Store
import { multiCaching } from 'cache-manager';
const multiCache = multiCaching([memoryCache, someOtherCache]);
const userId2 = 456;
const key2 = 'user_' + userId2;
const ttl = 5 * 1000;
await multiCache.set('foo2', 'bar2', ttl);
console.log(await multiCache.get('foo2'));
await multiCache.del('foo2');
await multiCache.mset(
[
['foo', 'bar'],
['foo2', 'bar2'],
],
ttl
);
console.log(await multiCache.mget('foo', 'foo2'));
await multiCache.mdel('foo', 'foo2');
See unit tests in test/multi-caching.test.ts for more information.
Cache Manager Options
The caching function accepts an options object as the second parameter. The following options are available:
- ttl: The time to live in milliseconds. This is the maximum amount of time that an item can be in the cache before it is removed.
- refreshThreshold: discussed in detail below.
- isCacheable: a function to determine whether the value is cacheable or not.
- onBackgroundRefreshError: a function to handle errors that occur during background refresh.
import { caching } from 'cache-manager';
const memoryCache = await caching('memory', {
  max: 100,
  ttl: 10 * 1000,
  shouldCloneBeforeSet: false,
});
When creating a memory store, you also get these additional options:
- max: The maximum number of items that can be stored in the cache. If the cache is full, the least recently used item is removed.
- shouldCloneBeforeSet: If true, the value will be cloned before being set in the cache. This is set to true by default.
Refresh cache keys in background
Both the caching and multicaching modules support a mechanism to refresh expiring cache keys in the background when using the wrap function. This is done by adding a refreshThreshold attribute while creating the caching store, or by passing it to the wrap function.

If refreshThreshold is set, then after retrieving a value from the cache, the remaining TTL is checked. If it is less than refreshThreshold, the system will update the value asynchronously, following the same rules as a standard fetch. In the meantime, the system will return the old value until expiration.
NOTES:
- In case of multicaching, the store that will be checked for refresh is the one where the key is found first (highest priority).
- If the threshold is low and the worker function is slow, the key may expire and you may encounter a race condition when updating values.
- The background refresh mechanism currently does not support providing multiple keys to the wrap function.
- If no ttl is set for the key, the refresh mechanism will not be triggered. For Redis, the ttl is set to -1 by default.
For example, pass the refreshThreshold to caching like this:
const memoryCache = await caching('memory', {
  max: 100,
  ttl: 10 * 1000,
  refreshThreshold: 3 * 1000,
  onBackgroundRefreshError: (error) => { /* handle error */ },
});
When a value is retrieved from the cache with a remaining TTL of less than 3 seconds, the value will be updated in the background.
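The mechanism described above can be sketched as: serve the cached value immediately, but when its remaining TTL has dropped below the threshold, kick off an asynchronous refresh. This is a conceptual sketch, not cache-manager's code:

```typescript
// Conceptual sketch of background refresh: return the cached value right
// away; if its remaining TTL is below refreshThreshold, refresh it
// asynchronously. NOT cache-manager's actual implementation.
type Entry<T> = { value: T; expiresAt: number };

function refreshingCache<T>(ttlMs: number, refreshThresholdMs: number) {
  const store = new Map<string, Entry<T>>();
  return {
    async wrap(key: string, fn: () => Promise<T>): Promise<T> {
      const hit = store.get(key);
      const now = Date.now();
      if (hit && hit.expiresAt > now) {
        if (hit.expiresAt - now < refreshThresholdMs) {
          // Remaining TTL below threshold: refresh in the background,
          // but return the old value immediately.
          fn()
            .then((value) => store.set(key, { value, expiresAt: Date.now() + ttlMs }))
            .catch(() => { /* this is where onBackgroundRefreshError fits */ });
        }
        return hit.value;
      }
      const value = await fn(); // miss or expired: fetch synchronously
      store.set(key, { value, expiresAt: now + ttlMs });
      return value;
    },
  };
}
```

Note how a slow fn combined with a low threshold leaves a window where the key can expire before the refresh lands, which is the race condition called out in the notes above.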
Error Handling
multiCaching no longer throws errors by default. Instead, all errors are emitted through the error event. Here is an example of how to use it:
const multicache = multiCaching([memoryCache, someOtherCache]);
multicache.on('error', (error) => {
console.error('Cache error:', error);
});
Using non-blocking set with wrap
By default, when using wrap, the value is set in the cache before the function returns. While this behaviour can prevent additional calls to downstream resources, it can also slow down the response time. This can be changed by setting the nonBlockingSet option to true. Doing so will make the function return before the value is set in the cache. The setting applies to both single and multi caches.
cache.wrap('key', () => fetchValue(), 1000, 500, { nonBlockingSet: true });
Express Middleware
This example sets up an Express application with a caching mechanism using cache-manager. The cacheMiddleware checks if the response for a request is already cached and returns it if available. If not, it proceeds to the route handler, caches the response, and then returns it. This helps to reduce the load on the server by avoiding repeated processing of the same requests.
import { caching } from 'cache-manager';
import express from 'express';
const memoryCache = await caching('memory', {
max: 100,
ttl: 10 * 1000
});
const app = express();
const port = 3000;
const cacheMiddleware = async (req, res, next) => {
const key = req.originalUrl;
try {
const cachedResponse = await memoryCache.get(key);
if (cachedResponse) {
return res.send(cachedResponse);
} else {
res.sendResponse = res.send;
res.send = async (body) => {
await memoryCache.set(key, body);
res.sendResponse(body);
};
next();
}
} catch (err) {
next(err);
}
};
app.get('/data', cacheMiddleware, (req, res) => {
setTimeout(() => {
res.send({ data: 'This is some data', timestamp: new Date() });
}, 2000);
});
app.listen(port, () => {
console.log(`Server is running on http://localhost:${port}`);
});
Store Engines
Official and updated to the latest version
Third party
Contribute
If you would like to contribute to the project, please read CONTRIBUTING.md.
License
cache-manager is licensed under the MIT license.